Adaptation of neuromagnetic N1 responses to phonetic stimuli by visual speech in humans.

Authors

  • Iiro P Jääskeläinen
  • Ville Ojanen
  • Jyrki Ahveninen
  • Toni Auranen
  • Sari Levänen
  • Riikka Möttönen
  • Iina Tarnanen
  • Mikko Sams
Abstract

306-channel magnetoencephalography (MEG) was used in eight healthy volunteers to test whether silent lip-reading modulates auditory-cortex processing of phonetic sounds. Auditory test stimuli (the Finnish vowel /ae/ or /ø/) were preceded, at a 500 ms lag, by either another auditory stimulus (/ae/, /ø/, or a vowel at the second-formant midpoint between /ae/ and /ø/) or a silent movie of a person articulating /ae/ or /ø/. Compared with N1 responses to auditory /ae/ and /ø/ presented without a preceding stimulus, the amplitudes of left-hemisphere N1 responses to the test stimuli were significantly suppressed when preceded by either auditory or visual stimuli, and this suppression was significantly stronger for preceding auditory stimuli. This suggests that seeing a speaker's articulatory gestures influences auditory speech perception by modulating the responsiveness of auditory-cortex neurons.


Similar articles

Early-latency categorical speech sound representations in the left inferior frontal gyrus

Efficient speech perception requires the mapping of highly variable acoustic signals to distinct phonetic categories. How the brain overcomes this many-to-one mapping problem has remained unresolved. To infer the cortical location, latency, and dependency on attention of categorical speech sound representations in the human brain, we measured stimulus-specific adaptation of neuromagnetic respon...


Electrophysiological evidence for speech-specific audiovisual integration.

Lip-read speech is integrated with heard speech at various neural levels. Here, we investigated the extent to which lip-read-induced modulations of the auditory N1 and P2 (measured with EEG) are indicative of speech-specific audiovisual integration, and we explored to what extent the ERPs were modulated by phonetic audiovisual congruency. In order to disentangle speech-specific (phonetic) integ...


Seeing speech: visual information from lip movements modifies activity in the human auditory cortex.

Neuromagnetic responses were recorded over the left hemisphere to find out in which cortical area the heard and seen speech are integrated. Auditory stimuli were Finnish /pa/ syllables presented together with a videotaped face articulating either the concordant syllable /pa/ (84% of stimuli, V = A) or the discordant syllable /ka/ (16%, V ≠ A). In some subjects the probabilities were revers...


Difficulties in segregating concurrent speech sounds in hearing-impaired children

Objective: This study was a basic investigation of the ability to segregate concurrent speech sounds in hearing-impaired children. Concurrent segregation is one of the fundamental components of auditory scene analysis and plays an important role in speech perception. In the present study, we compared auditory late responses (ALRs) between hearing-impaired and normal-hearing children. Materials & Methods...


Neurophysiologic correlates of cross-language phonetic perception.

This study examined neurophysiologic correlates of the perception of native and nonnative phonetic categories. Behavioral and electrophysiologic responses were obtained from Hindi and English listeners in response to a stimulus continuum of naturally produced, bilabial CV stimuli that differed in VOT from -90 to 0 ms. These speech sounds constitute phonemically relevant categories in Hindi but ...



Journal:
  • Neuroreport

Volume 15, Issue 18

Pages: -

Year of publication: 2004